The '''interaction information''' (McGill 1954), also called the ''amount of information'' (Hu Kuo Ting 1962) or ''co-information'' (Bell 2003), is one of several generalizations of the mutual information. It expresses the amount of information (redundancy or synergy) bound up in a set of variables, ''beyond'' that which is present in any subset of those variables. Unlike the mutual information, the interaction information can be either positive or negative. This confusing property has likely retarded its wider adoption as an information measure in machine learning and cognitive science. These functions, their negativity, and their minima have a direct interpretation in algebraic topology (Baudot & Bennequin 2015).

== The Three-Variable Case ==

For three variables <math>\{X, Y, Z\}</math>, the interaction information is given by

:<math>I(X;Y;Z) = I(X;Y|Z) - I(X;Y)</math>

where, for example, <math>I(X;Y)</math> is the mutual information between variables <math>X</math> and <math>Y</math>, and <math>I(X;Y|Z)</math> is the conditional mutual information between variables <math>X</math> and <math>Y</math> given <math>Z</math>. Formally,

:<math>I(X;Y|Z) = H(X|Z) + H(Y|Z) - H(X,Y|Z)</math>
:<math>I(X;Y) = H(X) + H(Y) - H(X,Y)</math>

It thus follows that

:<math>I(X;Y;Z) = -\bigl[H(X) + H(Y) + H(Z)\bigr] + \bigl[H(X,Y) + H(X,Z) + H(Y,Z)\bigr] - H(X,Y,Z).</math>

For the three-variable case, the interaction information is the difference between the information shared by <math>\{X, Y\}</math> when <math>Z</math> has been fixed and when <math>Z</math> has not been fixed. (See also Fano's 1961 textbook.) Interaction information measures the influence of a variable <math>Z</math> on the amount of information shared between <math>X</math> and <math>Y</math>. Because the term <math>I(X;Y|Z)</math> can be zero (for example, when the dependency between <math>X</math> and <math>Y</math> is due entirely to the influence of a common cause <math>Z</math>), the interaction information can be negative as well as positive. Negative interaction information indicates that the variable <math>Z</math> inhibits (i.e., ''accounts for'' or ''explains'' some of) the correlation between <math>X</math> and <math>Y</math>, whereas positive interaction information indicates that <math>Z</math> facilitates or enhances that correlation.

Interaction information is bounded. In the three-variable case, it is bounded by

:<math>-\min\{ I(X;Y), I(Y;Z), I(X;Z) \} \;\leq\; I(X;Y;Z) \;\leq\; \min\{ I(X;Y|Z), I(Y;Z|X), I(X;Z|Y) \}.</math>
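These quantities can be computed directly from a joint distribution. The following Python sketch is an illustrative example, not from the cited sources; it assumes the joint distribution is supplied as a numpy array indexed as <code>p[x, y, z]</code>, and the function names are hypothetical. It evaluates the entropy expansion of <math>I(X;Y;Z)</math> above and reproduces the two sign cases just discussed: a synergistic XOR relation yields positive interaction information, while a common cause yields negative interaction information.

<syntaxhighlight lang="python">
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability array, ignoring zero cells."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def interaction_information(pxyz):
    """I(X;Y;Z) via the entropy expansion
    -[H(X)+H(Y)+H(Z)] + [H(X,Y)+H(X,Z)+H(Y,Z)] - H(X,Y,Z)."""
    pxyz = np.asarray(pxyz, dtype=float)
    h_x = entropy(pxyz.sum(axis=(1, 2)))   # single-variable marginals
    h_y = entropy(pxyz.sum(axis=(0, 2)))
    h_z = entropy(pxyz.sum(axis=(0, 1)))
    h_xy = entropy(pxyz.sum(axis=2))       # pairwise marginals
    h_xz = entropy(pxyz.sum(axis=1))
    h_yz = entropy(pxyz.sum(axis=0))
    h_xyz = entropy(pxyz)
    return -(h_x + h_y + h_z) + (h_xy + h_xz + h_yz) - h_xyz

# Synergy: X, Y independent fair bits, Z = X XOR Y.
# I(X;Y) = 0 but I(X;Y|Z) = 1 bit, so I(X;Y;Z) = +1.
p_xor = np.zeros((2, 2, 2))
for x in (0, 1):
    for y in (0, 1):
        p_xor[x, y, x ^ y] = 0.25
print(interaction_information(p_xor))   # +1.0

# Redundancy: Z a fair bit, with X and Y both copies of Z (common cause).
# I(X;Y) = 1 bit but I(X;Y|Z) = 0, so I(X;Y;Z) = -1.
p_copy = np.zeros((2, 2, 2))
p_copy[0, 0, 0] = p_copy[1, 1, 1] = 0.5
print(interaction_information(p_copy))  # -1.0
</syntaxhighlight>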
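The bound stated above can likewise be checked numerically. The sketch below is again illustrative (the random distribution, seed, and helper are assumptions of this example): it draws an arbitrary joint distribution, computes the three pairwise mutual informations and three conditional mutual informations from entropies, and verifies that <math>I(X;Y;Z)</math> falls within the stated bound.

<syntaxhighlight lang="python">
import numpy as np

def H(p):
    """Shannon entropy in bits of a pmf given as an array of any shape."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(0)
p = rng.random((3, 4, 5))
p /= p.sum()                              # random joint pmf over X, Y, Z

hx, hy, hz = H(p.sum((1, 2))), H(p.sum((0, 2))), H(p.sum((0, 1)))
hxy, hxz, hyz = H(p.sum(2)), H(p.sum(1)), H(p.sum(0))
hxyz = H(p)

ixy = hx + hy - hxy                       # pairwise mutual informations
ixz = hx + hz - hxz
iyz = hy + hz - hyz
# Conditional MI, e.g. I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(Z) - H(X,Y,Z).
ixy_z = hxz + hyz - hz - hxyz
ixz_y = hxy + hyz - hy - hxyz
iyz_x = hxy + hxz - hx - hxyz

ii = ixy_z - ixy                          # interaction information I(X;Y;Z)
lower = -min(ixy, iyz, ixz)
upper = min(ixy_z, iyz_x, ixz_y)
assert lower - 1e-12 <= ii <= upper + 1e-12
print(f"{lower:.4f} <= {ii:.4f} <= {upper:.4f}")
</syntaxhighlight>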